SPAR-H Step-by-Step Guidance
Step-by-step guidance was developed recently at Idaho National Laboratory for the US Nuclear Regulatory Commission on the use of the Standardized Plant Analysis Risk-Human Reliability Analysis (SPAR-H) method for quantifying Human Failure Events (HFEs). This work was done to address SPAR-H user needs, specifically requests for additional guidance on the proper application of various aspects of the methodology. This paper overviews the steps of the SPAR-H analysis process and highlights some of the most important insights gained during the development of the step-by-step directions. This supplemental guidance for analysts is applicable when plant-specific information is available, and goes beyond the general guidance provided in existing SPAR-H documentation. The steps highlighted in this paper are: Step 1, categorize the HFE as diagnosis and/or action; Step 2, rate the performance shaping factors; Step 3, calculate the PSF-modified HEP; Step 4, account for dependence; and Step 5, apply the minimum value cutoff.
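The quantification steps named above can be sketched in code. This is a minimal illustration, not the guidance itself: the nominal HEPs (1E-2 diagnosis, 1E-3 action) and the adjustment-factor formula follow the published SPAR-H method, but the function names, the illustrative cutoff value, and the example PSF multipliers are assumptions, and Step 4 (dependence) is omitted.

```python
# Hedged sketch of SPAR-H quantification Steps 1-3 and 5.
# Nominal HEPs and the adjustment factor follow the published SPAR-H
# method; names and the cutoff value here are illustrative only.

NOMINAL_HEP = {"diagnosis": 1e-2, "action": 1e-3}  # Step 1: HFE category

def spar_h_hep(task_type, psf_multipliers, minimum_cutoff=1e-5):
    """Compute a PSF-modified HEP (Step 4, dependence, not shown)."""
    nhep = NOMINAL_HEP[task_type]
    composite = 1.0
    negative = 0
    for m in psf_multipliers:          # Step 2: rated PSF multipliers
        composite *= m
        if m > 1.0:                    # multiplier > 1 degrades performance
            negative += 1
    if negative >= 3:
        # SPAR-H adjustment factor: keeps the HEP below 1.0 when three
        # or more PSFs are rated negatively.
        hep = (nhep * composite) / (nhep * (composite - 1.0) + 1.0)
    else:
        hep = nhep * composite         # Step 3: PSF-modified HEP
    return max(min(hep, 1.0), minimum_cutoff)  # Step 5: minimum value cutoff

# Example: diagnosis task with two negative PSFs (x2 stress, x10 time)
print(spar_h_hep("diagnosis", [2.0, 10.0]))  # 0.2
```

With three or more negative PSFs the adjustment factor matters: three x10 multipliers on an action task give roughly 0.5 rather than an impossible HEP of 1000 x 0.001 capped at 1.0.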
Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Version 5.0: Verification and Validation (V&V) Manual. Volume 9
A verification and validation (V&V) process has been performed for the System Analysis Programs for Hands-on Integrated Reliability Evaluation (SAPHIRE) Version 5.0. SAPHIRE is a set of four computer programs that NRC developed for performing probabilistic risk assessments. They allow an analyst to perform many of the functions necessary to create, quantify, and evaluate the risk associated with a facility or process being analyzed. The programs are the Integrated Reliability and Risk Analysis System (IRRAS), System Analysis and Risk Assessment (SARA), Models And Results Database (MAR-D), and the Fault tree, Event tree, and Piping and instrumentation diagram (FEP) graphical editor. The intent of this program is to perform a V&V of successive versions of SAPHIRE; the previous effort was the V&V of SAPHIRE Version 4.0. The SAPHIRE 5.0 V&V plan is based on the SAPHIRE 4.0 V&V plan, with revisions to incorporate lessons learned from the previous effort. Likewise, the SAPHIRE 5.0 vital and nonvital test procedures are based on the test procedures from SAPHIRE 4.0, revised to cover the new SAPHIRE 5.0 features and to incorporate lessons learned. Most results from the testing were acceptable; however, some discrepancies between expected code operation and actual code operation were identified. Modifications made to SAPHIRE are identified.
A mixed treatment comparison meta-analysis of metaphylaxis treatments for bovine respiratory disease in beef cattle
Citation: Abell, K. M., Theurer, M. E., Larson, R. L., White, B. J., & Apley, M. (2017). A mixed treatment comparison meta-analysis of metaphylaxis treatments for bovine respiratory disease in beef cattle. Journal of Animal Science, 95(2), 626-635. doi:10.2527/jas2016.1062. The objective of this project was to evaluate the effects of antimicrobials approved for parenteral metaphylactic use in feeder and stocker calves on morbidity and mortality for bovine respiratory disease with the use of a mixed treatment comparison meta-analysis. An initial literature review was conducted in April 2016 through PubMed, Agricola, and CAB (Commonwealth Agricultural Bureau) for randomized controlled trials of metaphylaxis antimicrobials administered parenterally to incoming feedlot or stocker calves within 48 h of arrival. The final list of publications included 29 studies, with a total of 37 trials covering 8 different metaphylactic antimicrobials. Final event outcomes were categorized into bovine respiratory disease (BRD) morbidity cumulative incidence d 1 to <= 60 of the feeding period, BRD morbidity cumulative incidence d 1 to closeout of the feeding period, BRD mortality cumulative incidence d 1 to closeout of the feeding period, and BRD retreatment cumulative incidence morbidity d 1 to closeout of the feeding period. Network meta-analysis combined direct and indirect evidence for all the event outcomes to determine mean odds ratios (OR) with 95% credibility intervals (CrIs) for all metaphylactic antimicrobial comparisons. The "upper tier" treatment arms for morbidity d 1 to <= 60 included tulathromycin, gamithromycin, and tilmicosin. For BRD mortality cumulative incidence d 1 to closeout and BRD retreatment morbidity d 1 to closeout, classifying the treatment arms into tiers was not possible due to overlapping 95% CrIs.
The results of this project identified differences between metaphylactic antimicrobials, and the antimicrobial options appear to offer different odds of BRD morbidity and mortality in feedlot cattle.
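The core quantities in the abstract above, pairwise odds ratios and indirect comparisons through a common comparator, can be sketched briefly. This is a hedged illustration of the general technique a network meta-analysis builds on (here a Bucher-style indirect comparison on the point estimates only, without the Bayesian credibility intervals the study reports), and the event counts are hypothetical, not the study's data.

```python
def odds_ratio(events_trt, n_trt, events_ctl, n_ctl):
    """Pairwise odds ratio for a binary outcome (e.g., BRD morbidity)."""
    a, b = events_trt, n_trt - events_trt  # treated: events / non-events
    c, d = events_ctl, n_ctl - events_ctl  # control: events / non-events
    return (a * d) / (b * c)

def indirect_or(or_a_vs_common, or_b_vs_common):
    """Indirect comparison of A vs. B through a common comparator:
    the idea a network meta-analysis generalizes across many arms."""
    return or_a_vs_common / or_b_vs_common

# Hypothetical counts: 10/100 morbid on drug A vs. 20/100 on no metaphylaxis
or_a = odds_ratio(10, 100, 20, 100)
print(round(or_a, 3))  # 0.444

# Drug B vs. the same comparator, then A vs. B indirectly
or_b = odds_ratio(15, 100, 20, 100)
print(round(indirect_or(or_a, or_b), 3))
```

In the actual study, direct and indirect evidence are combined in a Bayesian model so that each comparison also carries a 95% CrI; the tiering described above fails exactly when those intervals overlap.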
Influence of fasting and transit stress on rumen fermentation in beef steers
Research report containing the results of a study to determine the effect of transit-related stress on rumination in beef cattle. Contents: Materials and methods; Results and discussion; Body weight and rectal temperature; Rumen parameters; References
Studies on the Feasibility of Predicting Feedlot Performance from Certain Laboratory Grain Analyses
Data from 14 cattle feeding trials were utilized to study the relationship between several laboratory analyses and animal feed intake (INTAKE), gain (ADG) and feed efficiency (F/G). Laboratory analyses considered were 6, 12 and 24 hr in vitro dry matter disappearance (IV6, IV12, IV24, respectively); in vitro gas production in 1 hr and 6 hr (GP1 and GP6, respectively); and degree of gelatinization (GEL). A multiple regression equation with variables for treatment and trial classification, initial weight and the quadratic effect of initial weight was fit to the data. The effect of initial weight was significant for all three performance variables, and the quadratic effect was significant for ADG and F/G. A second model was fit excluding the treatment classification, and the maximum R2 procedure was utilized to examine how well laboratory analyses accounted for variation among residuals from this second model. More variation was accounted for in the dependent variables F/G (34.96%) and INTAKE (17.81%) than ADG (5.16%) when a combination of all laboratory analyses except GEL was included in the model. Moreover, correlations between residuals of the second model and the laboratory analyses were higher for INTAKE and F/G than ADG and were all negative for INTAKE and F/G, suggesting a negative response in intake and an improved F/G ratio as starch alteration increases. Correlations between the laboratory analyses were generally quite high. This study suggests that no single laboratory analysis considered would be useful for the development of accurate, reliable equations for the prediction of feedlot performance, and combinations appear to have value only in the case of F/G and INTAKE.
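The two-stage analysis described above, fit a base regression on initial weight (and its quadratic effect), then correlate the residuals with the laboratory analyses, can be sketched as follows. The data here are simulated stand-ins, not the study's 14 trials, and the trial/treatment classification terms are omitted for brevity; only the analysis pattern is illustrated.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: intake declines with 24-h in vitro DM disappearance
# (IV24) and rises with initial body weight, plus noise.
n = 60
initial_wt = rng.uniform(250, 400, n)     # initial BW, kg (assumed range)
iv24 = rng.uniform(40, 80, n)             # 24-h in vitro DM disappearance, %
intake = 10 - 0.03 * iv24 + 0.01 * initial_wt + rng.normal(0, 0.3, n)

# Base model: intercept, initial weight, and its quadratic effect
# (the study's trial/treatment classification terms are omitted here).
X = np.column_stack([np.ones(n), initial_wt, initial_wt ** 2])
beta, *_ = np.linalg.lstsq(X, intake, rcond=None)
residuals = intake - X @ beta

# Stage two: correlate base-model residuals with a laboratory analysis.
r = np.corrcoef(residuals, iv24)[0, 1]
print(round(r, 2))  # negative, as the abstract reports for INTAKE
```

A negative residual correlation here mirrors the study's finding that intake falls as starch alteration (measured by the in vitro analyses) increases, after weight effects are removed.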
Management Factors to Decrease Health Problems in Weaned Calves
Economic losses caused by morbidity and mortality from bovine respiratory disease (BRD) in newly weaned/received cattle are one of the most significant problems facing the beef cattle industry. In small feedlots (100 to 1,000 animals marketed annually) throughout the United States (USDA-APHIS, 1994), death losses ranged from 1.5 to 2.7 per 100 animals marketed, with greater losses in western than in central regions of the US. Two-thirds to three-quarters of these deaths were attributed to respiratory disease (USDA-APHIS, 1994). Two factors contribute to the high incidence of BRD in newly received, lightweight (e.g., < 400 to 500 lb) cattle. First, stresses associated with weaning and transportation negatively impact the immune system (Blecha et al., 1984) at a time when the animal is often exposed to a variety of infectious agents as a result of marketing procedures. Second, feed intake by stressed calves is typically low (Cole, 1995), averaging approximately 1.5% of BW during the first 2 wk after arrival of lightweight feeder cattle (Galyean and Hubbert, 1995). This low feed, and thereby nutrient, intake may further impair immune function (Cole, 1995). Older (e.g., yearling) cattle typically have greater intake than lightweight cattle subjected to shipping stress, although outbreaks of BRD can still be a problem in older cattle. Practices that have been used to offset these negative factors that impact the health of newly received cattle include preconditioning (Cole, 1993), on-ranch vaccination programs (Parker et al., 1993), nutritional management, and prophylactic medication. This review will emphasize nutritional and prophylactic medication approaches and their effects on performance and health of newly weaned/received beef cattle.
Effects of Graded Levels of Sorghum Wet Distiller’s Grains and Degraded Intake Protein Supply on Performance and Carcass Characteristics of Feedlot Cattle Fed Steam-Flaked Corn-Based Diets
Two experiments evaluated different levels of sorghum wet distiller’s grains plus solubles (SWDG) and effects of increasing the degraded intake protein (DIP) concentration in diets containing SWDG on performance and carcass characteristics of feedlot cattle. In Exp. 1, 200 beef steers (average BW = 404 kg) were fed increasing levels of SWDG (0, 5, 10, and 15% of DM) and one level of corn wet distiller’s grains plus solubles (10% of DM), which replaced steam-flaked corn in a high-concentrate diet. Final BW (P = 0.04) and overall ADG (P = 0.01) decreased linearly with increasing levels of SWDG. Increasing SWDG decreased overall G:F (P = 0.01), hot carcass weight (P < 0.01), and LM area (P < 0.01). No differences were observed in overall DMI (P = 0.15) or other carcass characteristics (P ≥ 0.09). Neither DMI nor G:F differed between corn wet distiller’s grains plus solubles and SWDG when fed as 10% of the dietary DM. In Exp. 2, 200 steers (average BW = 369 kg) were fed either a control diet without SWDG (8.4% DIP) or one of three 10% SWDG diets with no urea added or with urea added at either 50% or 100% of the difference in DIP concentration between the no-urea and control diets. Final BW (P = 0.03), overall ADG (P = 0.04), and overall G:F (P = 0.05) were greater for cattle fed the control diet. A linear decrease was observed in overall DMI with increasing DIP (P = 0.02). Likewise, overall ADG decreased with increasing DIP (P = 0.08). Cattle fed the control diet had greater hot carcass weight (P = 0.03), fat thickness (P = 0.02), and yield grade (P = 0.01) than the average of those fed the 3 SWDG diets. Results from both experiments suggest decreased performance and carcass value with increasing levels of SWDG alone or combined with additional DIP. At 10% of the dietary DM, corn and sorghum wet distiller’s grains resulted in similar ADG and G:F.